Mitigating Forgetting in Online Continual Learning via Instance-Aware Parameterization (Supplemental) Hung-Jen Chen
Encourage the controller to search unseen blocks by Eq. 9; get the reward r by Eq. 3. We conduct an ablation study to show the strength of count-based search exploration, comparing the performance of InstAParam with and without count-based exploration. Although InstaNAS attempts to address this problem with "policy shuffling", we found that it does not solve the problem in this scenario. The detailed accuracy is listed in Table 2. CIFAR-10 and does not sacrifice the initial performance. First, we focus on the distribution of the policy for each task.
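The count-based exploration step above can be sketched as follows. This is a minimal illustration, not the paper's Eq. 9: the bonus formula, the `beta` coefficient, and the block names are all assumptions; the idea is only that blocks the controller has selected rarely earn a larger reward bonus, nudging it toward unseen blocks.

```python
from collections import Counter

def exploration_bonus(block_ids, counts, beta=0.1):
    """Hypothetical count-based bonus: each selected block contributes
    beta / sqrt(visit count), so rarely-visited blocks are rewarded more."""
    counts.update(block_ids)  # record this visit before computing the bonus
    return beta * sum(1.0 / (counts[b] ** 0.5) for b in block_ids)

counts = Counter()
b1 = exploration_bonus(["conv3x3", "conv5x5"], counts)  # first visit
b2 = exploration_bonus(["conv3x3", "conv5x5"], counts)  # second visit
# the bonus decays as visit counts grow, so b2 < b1
```

In training, such a bonus would be added to the task reward (Eq. 3 in the paper) so the controller's policy gradient also credits novelty.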
InstaNAS: Instance-aware Neural Architecture Search
An-Chieh Cheng, Chieh Hubert Lin, Da-Cheng Juan, Wei Wei, Min Sun
Neural Architecture Search (NAS) aims at finding one "single" architecture that achieves the best accuracy for a given task such as image recognition. In this paper, we study the instance-level variation, and demonstrate that instance-awareness is an important yet currently missing component of NAS. Based on this observation, we propose InstaNAS for searching toward instance-level architectures; the controller is trained to search and form a "distribution of architectures" instead of a single final architecture. Then during the inference phase, the controller selects an architecture from the distribution, tailored for each unseen image to achieve both high accuracy and short latency. The experimental results show that InstaNAS reduces the inference latency without compromising classification accuracy. On average, InstaNAS achieves 48.9% latency reduction on CIFAR-10 and 40.2% latency reduction on CIFAR-100 with respect to the MobileNetV2 architecture.
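The per-instance inference loop described in the abstract can be sketched as below. This is a hedged toy model, not InstaNAS itself: the stand-in `controller` (a per-input binary mask over candidate blocks), the block count of 8, and the `infer` helper are all illustrative assumptions. The point is only the control flow: the controller maps each input to its own sub-network, and inputs that activate fewer blocks cost less latency.

```python
import random

def controller(instance, num_blocks=8):
    """Stand-in for a trained controller: maps an input instance to a
    binary mask over candidate blocks (the per-instance architecture)."""
    rng = random.Random(hash(instance))  # deterministic within one run
    return [rng.random() < 0.5 for _ in range(num_blocks)]

def infer(instance):
    mask = controller(instance)   # architecture tailored to this input
    active = sum(mask)            # fewer active blocks => lower latency
    return {"mask": mask, "blocks_used": active}

out = infer("image_001")  # each instance gets its own sub-network
```

A single fixed architecture would use the same mask for every input; here "easy" inputs can be routed through cheaper sub-networks, which is the source of the average latency reduction the abstract reports.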